
    Space-Based Cosmic-Ray and Gamma-Ray Detectors: a Review

    Prepared for the 2014 ISAPP summer school, this review is focused on space-borne and balloon-borne cosmic-ray and gamma-ray detectors. It is meant to introduce the fundamental concepts necessary to understand the instrument performance metrics, how they tie to the design choices and how they can be effectively used in sensitivity studies. While the write-up does not aim to be complete or exhaustive, it is largely self-contained, in that related topics such as the basic physical processes governing the interaction of radiation with matter and the near-Earth environment are briefly reviewed.
    Comment: 86 pages, 70 figures, prepared for the 2014 ISAPP summer school. Change log in the write-up; ancillary material at https://bitbucket.org/lbaldini/crdetector

    Alternative approaches to Long Term Care financing. Distributive implications and sustainability for Italy.

    In the last decade, many countries have adopted tax schemes specifically aimed at financing programs for Long Term Care (LTC). These mechanisms have important distributional implications both within and across generations. Given the process of demographic ageing, the issue of inter- and intra-generational fairness is deeply linked with the problem of the long-term financial equilibrium of an LTC fund. In this paper we first compare, on a microdata sample of the Italian population, the distributive effects (both on current income and across generations) of six alternative approaches to financing an LTC scheme. In particular, we consider a hypothetical LTC scheme (with a size equivalent to that of the German one) to be introduced in Italy and analyse the distributive implications of some tax options, taken from the financing mechanisms implemented or under discussion in Germany, Luxembourg, Japan and Italy. In the second part of the paper we move from a static to a dynamic perspective: we study the long-term sustainability of a hypothetical Pay-As-You-Go (PAYG) LTC scheme operating in Italy (that is, assuming the projected Italian demographic trends) under scenarios that consider alternative indexation rules, growth rates of GNP and future incidence of disability among age groups.
    Keywords: long term care; distributive effects; tax-benefit model; intertemporal sustainability; trust fund
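    As a minimal illustration of the dynamic part of such an analysis, the sketch below projects the equilibrium contribution rate of a hypothetical PAYG LTC fund under different benefit indexation scenarios. All demographic and economic figures are invented placeholders, not the paper's microdata or its projected Italian demographic trends.

```python
# Minimal, illustrative sketch of a pay-as-you-go (PAYG) LTC fund projection.
# All demographic and economic figures below are hypothetical placeholders.
import numpy as np

years = np.arange(2025, 2066)          # 40-year horizon
t = np.arange(len(years))
workers = 23e6 * (1 - 0.004) ** t      # slowly shrinking workforce (assumption)
disabled = 3e6 * (1 + 0.010) ** t      # ageing -> more LTC recipients (assumption)
avg_wage = 32e3 * (1 + 0.015) ** t     # nominal wage growth scenario
avg_benefit0 = 12e3                    # initial yearly benefit per recipient

def equilibrium_rate(benefit_indexation):
    """Contribution rate that balances the fund each year:
    rate_t * workers_t * wage_t = disabled_t * benefit_t."""
    benefit = avg_benefit0 * (1 + benefit_indexation) ** t
    return (disabled * benefit) / (workers * avg_wage)

for idx in (0.0, 0.010, 0.015):        # benefit indexation scenarios
    rate = equilibrium_rate(idx)
    print(f"indexation {idx:.1%}: rate {rate[0]:.2%} in {years[0]} "
          f"-> {rate[-1]:.2%} in {years[-1]}")
```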

    Influence of Beam Broadening on the Accuracy of Radar Polarimetric Rainfall Estimation

    The quantitative estimation of rain rates using meteorological radar has been a major theme in radar meteorology and radar hydrology. Interest in polarimetric radar has grown in part because polarization diversity can reduce the effect of raindrop size variability on radar precipitation estimates, which has allowed progress on radar rainfall estimation and on hydrometeorological applications. From an operational point of view, however, the promised improvement in radar rainfall accuracy has not yet been fully demonstrated. The main reason for this limitation is the geometry of radar measurements combined with the variability of the spatial structure of precipitation systems. To overcome these difficulties, a methodology has been developed to transform the drop size distribution (DSD) estimated by a vertically pointing micro rain radar into a profile as given by a ground-based polarimetric radar. As a result, the rainfall rate at the ground is fixed at all ranges, whereas the broadening beam encompasses a large variability of DSDs. The resulting DSD profile is used to simulate the corresponding profile of radar measurements at C band. Rainfall algorithms based on polarimetric radar measurements were then used to estimate the rainfall within the radar beam. Finally, merit factors were used to quantitatively analyze the performance of the rainfall algorithms against the corresponding ground measurements obtained from a 2D video disdrometer (2DVD) positioned beside the micro rain radar. In this method, the change of the merit factors with range is directly attributable to the DSD variability inside the radar measurement volume, thus providing an assessment of the effects due to beam broadening.
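    As a rough illustration of how rainfall estimates follow from a DSD, the sketch below computes rain rate and reflectivity from two different gamma DSDs (a "ground" one and a "broadened-beam" one) and reports a normalized bias as a merit factor. The gamma parameters, the empirical fall-speed relation and the simple Z-R inversion are illustrative assumptions, not the C-band polarimetric algorithms evaluated in the paper.

```python
# Toy sketch: rain rate and reflectivity from a drop size distribution (DSD),
# plus a simple merit factor comparing a Z-R estimate with the "ground truth".
import numpy as np

D = np.linspace(0.1, 7.0, 300)                 # drop diameter [mm]
dD = D[1] - D[0]

def gamma_dsd(N0, mu, Lam):
    """N(D) [m^-3 mm^-1] for a gamma DSD."""
    return N0 * D**mu * np.exp(-Lam * D)

def rain_rate(N):
    """R [mm/h] from the DSD, with v(D) ~ 3.78 D^0.67 m/s (empirical fall speed)."""
    v = 3.78 * D**0.67
    return 6e-4 * np.pi * np.sum(N * D**3 * v) * dD

def reflectivity(N):
    """Z [mm^6 m^-3], sixth moment of the DSD."""
    return np.sum(N * D**6) * dD

# "True" DSD at the ground vs. a different DSD inside the broadened beam aloft
N_ground = gamma_dsd(8000, 2.0, 2.5)
N_beam = gamma_dsd(5000, 1.0, 2.0)

R_true = rain_rate(N_ground)
Z_beam = reflectivity(N_beam)
R_est = (Z_beam / 200.0) ** (1 / 1.6)          # Marshall-Palmer-type Z-R inversion

nb = (R_est - R_true) / R_true                 # normalized bias as a merit factor
print(f"R_true={R_true:.2f} mm/h, R_est={R_est:.2f} mm/h, normalized bias={nb:+.1%}")
```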

    Complexity vs. performance in granular embedding spaces for graph classification

    The most distinctive trait of structural pattern recognition in the graph domain is the ability to deal with the organization of, and relations between, the constituent entities of a pattern. Even if this can be convenient and/or necessary in many contexts, most state-of-the-art classification techniques cannot be deployed directly in the graph domain without first embedding graph patterns into a metric space. Granular Computing is a powerful information processing paradigm that can be employed to drive the synthesis of automatic embedding spaces from structured domains. In this paper we investigate several classification techniques starting from Granular Computing-based embedding procedures and provide a thorough overview in terms of model complexity, embedding space complexity and performance on several open-access datasets for graph classification. We observe that certain classification techniques perform poorly both in terms of complexity and learning performance, as in the case of the non-linear SVM, suggesting that the high dimensionality of the synthesized embedding space can negatively affect the effectiveness of these approaches. On the other hand, linear support vector machines, neuro-fuzzy networks and nearest neighbour classifiers have comparable performance in terms of accuracy, with the second being the most competitive in terms of structural complexity and the last being the most competitive in terms of embedding space dimensionality.
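    A minimal sketch of this kind of comparison is given below, with a synthetic feature matrix standing in for the Granular Computing-based graph embedding; only the linear SVM, non-linear SVM and nearest neighbour classifiers are included (scikit-learn has no standard neuro-fuzzy network), and accuracy is the only metric reported.

```python
# Sketch of a complexity-vs-performance comparison on an embedding matrix.
# A random synthetic matrix stands in for the Granular Computing-based graph
# embedding of the paper; the comparison is mirrored only at a high level.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC, LinearSVC

# Synthetic "embedded graphs": n_features plays the role of the number of
# information granules (i.e. the embedding space dimensionality).
X, y = make_classification(n_samples=600, n_features=300, n_informative=40,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "linear SVM": LinearSVC(dual=False),
    "non-linear SVM (RBF)": SVC(kernel="rbf", gamma="scale"),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:22s} accuracy = {model.score(X_te, y_te):.3f}")
```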

    A multi-objective optimization approach for the synthesis of granular computing-based classification systems in the graph domain

    The synthesis of a pattern recognition system usually aims at the optimization of a given performance index. However, in many real-world scenarios, there exist other desired facets to take into account. In this regard, multi-objective optimization acts as the main tool for the optimization of different (and possibly conflicting) objective functions in order to seek potential trade-offs among them. In this paper, we propose a three-objective optimization problem for the synthesis of a granular computing-based pattern recognition system in the graph domain. The core pattern recognition engine searches for suitable information granules (i.e., recurrent and/or meaningful subgraphs from the training data) on top of which the graph embedding procedure towards the Euclidean space is performed. In the latter, any classification system can be employed. The optimization problem aims at jointly optimizing the performance of the classifier, the number of information granules and the structural complexity of the classification model. Furthermore, we address the problem of selecting a suitable number of solutions from the resulting Pareto fronts in order to compose an ensemble of classifiers to be tested on previously unseen data. To perform such a selection, we employ a multi-criteria decision making routine, analyzing different case studies that differ in how much weight each objective function carries in the ranking process. Results on five open-access datasets of fully labeled graphs show that exploiting the ensemble is effective (especially when the structural complexity of the model plays a minor role in the decision making process) when compared against the baseline solution that solely aims at maximizing performance.
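    The sketch below illustrates the selection step only: given placeholder values for the three objectives, it extracts the non-dominated solutions and ranks them with a simple weighted sum, standing in for the paper's multi-criteria decision making routine.

```python
# Toy sketch: selecting ensemble members from a three-objective Pareto front
# (classification error, number of information granules, structural complexity).
# Objective values are random placeholders; the weighted ranking is only a
# stand-in for the multi-criteria decision making routine of the paper.
import numpy as np

rng = np.random.default_rng(0)
# Each row: (error, n_granules, complexity) for one candidate solution (all minimized)
F = rng.uniform(size=(50, 3)) * np.array([0.4, 200, 1.0])

def pareto_front(F):
    """Indices of non-dominated rows (all objectives minimized)."""
    keep = []
    for i, fi in enumerate(F):
        dominated = np.any(np.all(F <= fi, axis=1) & np.any(F < fi, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = pareto_front(F)

# Weighted ranking on min-max normalized objectives; the weights encode how much
# each objective matters in the decision (here: complexity matters least).
G = F[front]
G_norm = (G - G.min(axis=0)) / (G.max(axis=0) - G.min(axis=0) + 1e-12)
weights = np.array([0.5, 0.4, 0.1])
scores = G_norm @ weights
ensemble = front[np.argsort(scores)[:5]]       # pick the 5 best-ranked solutions
print("selected ensemble members:", ensemble)
```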

    Relaxed Dissimilarity-based Symbolic Histogram Variants for Granular Graph Embedding

    Graph embedding is an established and popular approach when designing graph-based pattern recognition systems. Amongst the several strategies, in the last ten years Granular Computing has emerged as a promising framework for structural pattern recognition. In the late 2000's, symbolic histograms were proposed as the driving force for performing the graph embedding procedure by counting the number of times each granule of information appears in the graph to be embedded. Similarly to a bag-of-words representation of a text corpus, symbolic histograms were originally conceived as integer-valued vector representations of graphs. In this paper, we propose six 'relaxed' versions of symbolic histograms, in which the actual dissimilarity values between the information granules and the constituent parts of the graph to be embedded are taken into account, information which is discarded in the original symbolic histogram formulation due to the hard-limited nature of the counting procedure. Experimental results on six open-access datasets of fully-labelled graphs show comparable performance in terms of classification accuracy with respect to the original symbolic histograms (average accuracy shift ranging from -7% to +2%), counterbalanced by a great improvement in the number of resulting information granules, and hence the number of features in the embedding space (up to 75% fewer features, on average).
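    A minimal sketch of the difference between the two representations is given below; the Gaussian soft count used for the relaxed variant is one illustrative choice and is not claimed to coincide with any of the six variants proposed in the paper.

```python
# Minimal sketch: hard-counting symbolic histogram vs. one possible "relaxed"
# variant based on dissimilarity values.
import numpy as np

def symbolic_histogram(dissim, tau):
    """Hard counting: granule g is counted for subgraph s if d(g, s) <= tau.
    dissim has shape (n_granules, n_subgraphs)."""
    return (dissim <= tau).sum(axis=1).astype(float)

def relaxed_histogram(dissim, tau):
    """Relaxed counting: each subgraph contributes a soft membership in (0, 1]
    that decays with its dissimilarity from the granule (illustrative choice)."""
    return np.exp(-(dissim / tau) ** 2).sum(axis=1)

# Dissimilarities between 4 information granules and the 6 constituent
# subgraphs of one graph to be embedded (placeholder values).
rng = np.random.default_rng(1)
dissim = rng.uniform(0, 2, size=(4, 6))

print("hard   :", symbolic_histogram(dissim, tau=0.8))
print("relaxed:", np.round(relaxed_histogram(dissim, tau=0.8), 3))
```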

    From Beam to Chassis: How to Increase NVH Performances with an Optimized Moment of Inertia Distribution

    Car weight reduction is becoming more and more important for every kind of vehicle: a lower mass implies lower consumption, makes it easier to fulfill homologation rules and ensures better handling behavior. Despite that, several vehicle requirements, NVH among them, have traditionally been met by adding mass. In this paper, a methodology to optimize the stiffness distribution is proposed in order to obtain better vibrational performance without increasing the mass. At first, the problem is solved for a simple beam using finite element analysis and optimization algorithms. At a second stage, the optimal moment of inertia distribution found is applied to a chassis by means of a topometry optimization. Finally, the improvement in NVH performance is verified by comparing the inertances of the optimized model with those of the non-optimized one.
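    As a toy version of the first step (the simple beam), the sketch below assembles an Euler-Bernoulli finite element model of a cantilever, keeps the cross-section area (and hence the mass) fixed, and compares the first natural frequency obtained with a uniform versus a root-biased moment of inertia distribution. Material data and dimensions are arbitrary placeholders, not the paper's FE model or optimization setup.

```python
# Compact Euler-Bernoulli finite element sketch: with the cross-section area
# (hence the mass) held fixed, redistribute the second moment of area along a
# cantilever and compare the first natural frequency.
import numpy as np
from scipy.linalg import eigh

E, rho, A = 210e9, 7850.0, 1e-3          # steel-like material, 10 cm^2 section
L_tot, n_el = 1.0, 20
Le = L_tot / n_el

def first_frequency(I_per_element):
    ndof = 2 * (n_el + 1)                # 2 DOFs per node: deflection, rotation
    K = np.zeros((ndof, ndof)); M = np.zeros((ndof, ndof))
    for e, I in enumerate(I_per_element):
        l = Le
        Ke = (E * I / l**3) * np.array([[12, 6*l, -12, 6*l],
                                        [6*l, 4*l*l, -6*l, 2*l*l],
                                        [-12, -6*l, 12, -6*l],
                                        [6*l, 2*l*l, -6*l, 4*l*l]])
        Me = (rho * A * l / 420) * np.array([[156, 22*l, 54, -13*l],
                                             [22*l, 4*l*l, 13*l, -3*l*l],
                                             [54, 13*l, 156, -22*l],
                                             [-13*l, -3*l*l, -22*l, 4*l*l]])
        dofs = slice(2*e, 2*e + 4)
        K[dofs, dofs] += Ke; M[dofs, dofs] += Me
    K, M = K[2:, 2:], M[2:, 2:]          # clamp node 0 (cantilever)
    w2 = eigh(K, M, eigvals_only=True)[0]
    return np.sqrt(w2) / (2 * np.pi)     # first natural frequency [Hz]

I_uniform = np.full(n_el, 1e-6)                       # m^4
# Same total amount of I, but biased towards the clamped end
weights = np.linspace(2.0, 0.2, n_el)
I_biased = I_uniform.sum() * weights / weights.sum()

print(f"uniform I: f1 = {first_frequency(I_uniform):.1f} Hz")
print(f"biased  I: f1 = {first_frequency(I_biased):.1f} Hz")
```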

    Intrusion detection in Wi-Fi networks by modular and optimized ensemble of classifiers

    With the breakthrough of pervasive advanced networking infrastructures and paradigms such as 5G and IoT, cybersecurity has become an active and crucial field in recent years. Furthermore, machine learning techniques are gaining more and more attention as prospective tools for mining (possibly malicious) packet traces and for the automatic synthesis of network intrusion detection systems. In this work, we propose a modular ensemble of classifiers for spotting malicious attacks on Wi-Fi networks. Each classifier in the ensemble is tailored to characterize a given attack class and is individually optimized by means of a genetic algorithm wrapper with the dual goal of tuning hyper-parameters and retaining only the features relevant to a specific attack class. Our approach also considers a novel false alarm management procedure thanks to a proper reliability measure formulation. The proposed system has been tested on the well-known AWID dataset, showing performance comparable with other state-of-the-art works both in terms of accuracy and knowledge discovery capabilities. Our system is also characterized by a modular design of the classification model, allowing new attack classes to be included in an efficient way.
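    A schematic sketch of the modular idea is shown below: one binary classifier per attack class plus a simple reliability threshold for filtering low-confidence alarms. Synthetic data stands in for the AWID traces, a random forest stands in for the paper's optimized classifiers, and the genetic algorithm wrapper is omitted entirely.

```python
# Schematic sketch of a per-attack-class ensemble with a reliability check.
# Synthetic data and a random forest are stand-ins; no GA wrapper is included.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic multi-class traffic: class 0 = normal, classes 1..3 = attack types
X, y = make_classification(n_samples=3000, n_features=30, n_informative=12,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# One binary classifier per attack class (attack-vs-rest)
ensemble = {}
for attack in (1, 2, 3):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_tr, (y_tr == attack).astype(int))
    ensemble[attack] = clf

def predict_with_reliability(x, threshold=0.7):
    """Return (label, reliability); alarms below the threshold are reported as
    'unreliable' instead of raising an alert (toy false-alarm management)."""
    probs = {a: clf.predict_proba(x.reshape(1, -1))[0, 1] for a, clf in ensemble.items()}
    attack, p = max(probs.items(), key=lambda kv: kv[1])
    if p < 0.5:
        return "normal", 1 - p
    return (attack if p >= threshold else "unreliable"), p

label, rel = predict_with_reliability(X_te[0])
print(f"sample 0 -> {label} (reliability {rel:.2f}), true class = {y_te[0]}")
```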

    Structural optimization of automotive chassis: theory, set up, design

    Improvements in the design of structural components are often achieved on a trial-and-error basis guided by the designer's know-how. Although designer experience must remain a fundamental aspect of design, such an approach is likely to allow only marginal product enhancements. A different mindset that could boost structural design is needed, and it can be provided by structural optimization methods coupled with finite element analyses. These methods are briefly introduced here, and some applications are presented and discussed with the aim of showing their potential. A particular focus is given to weight reduction in automotive chassis design applications, following the experience gained at MilleChili Lab.